
    Estimating Answer Sizes for XML Queries

    Estimating the sizes of query results, and of intermediate results, is crucial to many aspects of query processing. In particular, it is necessary for effective query optimization. Even at the user level, predictions of the total result size can be valuable in “next-step” decisions, such as query refinement. This paper proposes a technique to obtain query result size estimates effectively in an XML database. Queries in XML frequently specify structural patterns, requiring specific relationships between selected elements. Whereas traditional techniques can estimate the number of nodes (XML elements) that will satisfy a node-specific predicate in the query pattern, such estimates cannot easily be combined to provide estimates for the entire query pattern, since element occurrences are expected to be highly correlated. We propose a solution based on a novel histogram encoding of element occurrence position. With such position histograms, we are able to estimate the sizes of complex pattern queries, as well as of simpler intermediate patterns that may be evaluated in alternative query plans, by means of a position histogram join (pH-join) algorithm that we introduce. We extend our technique to exploit schema information regarding allowable structure (the no-overlap property) through the use of a coverage histogram. We present an extensive experimental evaluation using several XML data sets, both real and synthetic, with a variety of queries. Our results demonstrate that accurate and robust estimates can be achieved, with limited space and at minuscule computational cost. These techniques have been implemented in the context of the TIMBER native XML database [22] at the University of Michigan.
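
    As a rough illustration of the position-histogram idea, here is a minimal sketch (in Python) of a pH-join style estimate for an ancestor-descendant pattern A//D. It assumes the usual interval numbering, under which A is an ancestor of D exactly when A.start < D.start and D.end < A.end; the grid size and node positions are made-up examples, and the sketch counts only bucket pairs that guarantee containment, omitting the proportional handling of partially overlapping buckets that the full algorithm performs.

    import numpy as np

    def position_histogram(intervals, max_pos, grid=4):
        """2-D histogram over (start, end) positions, grid x grid buckets."""
        h = np.zeros((grid, grid), dtype=float)
        for s, e in intervals:
            h[s * grid // (max_pos + 1), e * grid // (max_pos + 1)] += 1
        return h

    def ph_join_estimate(h_anc, h_desc):
        """For each ancestor bucket (i, j), count descendants in buckets
        (k, l) that certainly satisfy containment: k > i and l < j."""
        grid = h_anc.shape[0]
        total = 0.0
        for i in range(grid):
            for j in range(grid):
                if h_anc[i, j]:
                    total += h_anc[i, j] * h_desc[i + 1:, :j].sum()
        return total

    # Toy document: two candidate ancestors, three candidate descendants.
    a_nodes = [(0, 15), (1, 7)]          # (start, end) intervals
    d_nodes = [(2, 3), (4, 5), (9, 10)]
    est = ph_join_estimate(position_histogram(a_nodes, 15),
                           position_histogram(d_nodes, 15))
    print(est)  # certain-containment estimate (2.0); the exact count is 5

    On this deliberately coarse grid the estimate undercounts, which is exactly the gap that the paper's proportional treatment of partial buckets is designed to close.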

    Towards Building Wind Tunnels for Data Center Design

    Data center design is a tedious and expensive process. Recently, this process has become even more challenging as users of cloud services expect guaranteed levels of availability, durability and performance. A new challenge for service providers is to find the most cost-effective data center design and configuration that will accommodate users’ expectations on ever-changing workloads and constantly evolving hardware and software components. In this paper, we argue that data center design should become a systematic process. First, it should be done using an integrated approach that takes into account both the hardware and the software interdependencies, and their impact on users’ expectations. Second, it should be performed in a “wind tunnel”, which uses large-scale simulation to systematically explore the impact of a data center configuration on both the users’ and the service providers’ requirements. We believe that this is the first step towards systematic data center design, an exciting area for future research.

    Effect of metallized polyethylene terephthalate, vacuum packaging and storage temperature on shelf life of papaya pulp Kalakand (Indian cookie)

    Dairy plants are looking for newer products for diversification and value addition. There is scope for the dairy industry to introduce newer products as healthy, convenient, ready-to-eat foods for capacity utilization and value addition, but because of their complex biochemical composition and high water content, milk and milk products are an excellent culture medium for the growth and multiplication of a variety of microorganisms. Vacuum packaging reduces product shrinkage and trim losses by eliminating oxidation and freezer burn, thereby enhancing product quality. Metallized polyethylene terephthalate (MET PET) combined with vacuum packaging now has a promising role in the storage of various value-added milk products. The developed value-added Kalakand (Indian cookie) could be stored successfully for 5 days in MET PET packaging material at 4±1°C; when the product was packaged under vacuum, the shelf life increased to 10 days at 4±1°C.

    Efficacy and superiority of an innovative method (IM) of intravenous (IV) fluid drip drop rate calculation using IV set and its comparison with conventional methods (CM)

    Background: Almost every indoor patient requires some form of intravenous (IV) fluid, and its infusion rate should match the recommendation for the best treatment outcomes. To make this easier, a simple, quick and easily applicable new method for drip rate calculation is proposed, which is user-friendly at the bedside and does not require mathematical skill or assistance. Methods: The author compared this novel innovative method (IM) of IV drip rate calculation, for both regular macro- and micro-drop infusion sets, against the conventional mathematical calculation method (CM) on various indoor IV fluid orders, and assessed the time-to-initiation of treatment (TI) and the accuracy of each. Ten resident doctors and ten nursing staff graded both methods on pre-printed forms covering parameters such as time consumption, comfort level, accuracy and applicability in the ward, scoring each method on a scale of 1 to 10. Results: The conventional method (CM) required 14.23±1.10 seconds, while the novel method (IM) required on average 3.63±0.73 seconds to calculate the drop rate. The average grade for the conventional method was 3.63±0.49 and for the novel method 7.84±0.6 out of 10. Conclusions: The novel IV drip rate formula is easy, quick and superior to the conventional method, and it does not require any additional instruments. It is a good alternative to the conventional formula for IV drip rate calculation in the absence of an infusion pump.
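
    For context, the conventional method (CM) the study benchmarks is presumably the standard bedside formula, drops per minute = volume (mL) × drop factor (gtt/mL) ÷ time (min); the abstract spells out neither formula, and the novel method itself is not described, so the Python sketch below illustrates only the conventional calculation, with typical drop factors (macro sets 15 or 20 gtt/mL, micro sets 60 gtt/mL) that are general assumptions rather than figures from the paper.

    def drip_rate(volume_ml: float, time_min: float, drop_factor: int) -> float:
        """Drops per minute for a gravity IV infusion (conventional formula)."""
        return volume_ml * drop_factor / time_min

    # Example: 500 mL over 4 hours with a 20 gtt/mL macro-drip set.
    print(round(drip_rate(500, 4 * 60, drop_factor=20)))  # about 42 drops/min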